Introduction to Android sensors
The
OpenSignals mobile application
(
Google Play link
) allows acquiring data from the internal sensors built into the hardware of an Android smartphone.
In this Jupyter Notebook we will have a detailed look at the sensor types that are part of the Android system. We will explore how Android acquires data for each sensor and delve into the limitations of the operating system when acquiring data. The focus of this Jupyter Notebook will be the data returned by the OpenSignals mobile application and the Android system.
Thus, the functions used here will not always be explained in detail. If you want to have a more detailed look at the functions used, you can always visit our
GitHub biosignalsnotebooks
repository
. We will also provide direct links to each function used. If some context on a function's behaviour is needed, it will be carefully given.
In case you want a more in-depth look at how Android handles its sensors, you can have a look at the
developers page
or their
source page
on Android sensors.
1 - Package imports
First, let's import some useful libraries that will be used throughout this Notebook.
# biosignalsnotebooks package
import biosignalsnotebooks as bsnb
# package for using operating system dependent functionality
import os
# numpy package
from numpy import cumsum
2 - How the Android operating system acquires sensor data and its limitations
The Android system acquires sensor data through an event-based system: the operating system waits for an event to happen and, when it occurs, picks it up and handles it accordingly. The event structure used for acquiring sensor data is called a sensor event. One major limitation of these sensor events is that they may not occur at fixed time intervals. The reason for this is that the Android operating system tries to optimize the battery consumption of the phone and thus only registers these events when really needed. As a consequence, the data acquired from internal Android sensors with the OpenSignals mobile application will most likely not be sampled equidistantly.

The influence of the Android operating system on the sampling capacity of its internal sensors has other implications as well. Most importantly, when setting a sampling rate for the Android sensors in the OpenSignals mobile application, the system only uses this rate as a suggestion and may sample at lower or higher rates. When other applications that make use of internal sensors are running at the same time, the sampling capacity may be altered as well. Additionally, due to the event-based system, multiple sensors are not necessarily sampled at the same time instant, so sensors may start and stop recording at different times. In order to know when the Android system picked up on a sensor event, a timestamp is associated with each event. This timestamp is the elapsed time, in nanoseconds, since the last boot of the phone.
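Since each sensor event carries a nanosecond timestamp, the irregular sampling can be made visible by looking at the inter-event intervals. A minimal sketch with hypothetical timestamp values (not taken from a real recording):

```python
from numpy import array, diff

# hypothetical sensor event timestamps in nanoseconds since the last boot
timestamps_ns = array([1_000_000_000, 1_010_500_000, 1_019_200_000, 1_030_100_000, 1_039_700_000])

# inter-event intervals in milliseconds; note that they are not constant
intervals_ms = diff(timestamps_ns) * 1e-6
print(intervals_ms)

# effective sampling rate estimated from the mean interval (in Hz)
print(round(1000 / intervals_ms.mean(), 1))  # → 100.8
```

Even though the mean rate here is close to a nominal 100 Hz, the individual intervals fluctuate by a few milliseconds, which is exactly the behaviour described above.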
Other aspects, such as the hardware of a phone, need to be accounted for as well. Android supports a variety of different phone manufacturers. This has the consequence that each phone model released by a manufacturer may include different sensor types with their particular specifications. Therefore, doing a recording with two distinct phones may result in different data recordings.
Summing it up, we can thus state the following important facts:
- Sensor data is acquired through an event-based system, so samples may not be equidistant in time.
- The sampling rate set in the OpenSignals mobile application is only a suggestion; the system may sample at lower or higher rates.
- Other applications using the internal sensors at the same time may alter the sampling capacity.
- Different sensors may start and stop recording at different times.
- Each sensor event carries a timestamp in nanoseconds since the last boot of the phone.
- Different phone models may include different sensor types with their particular specifications.
These facts should always be kept in mind when acquiring data from Android sensors. However, some of these limitations can be overcome with simple post-processing steps. For instance, the non-equidistant sampling can be corrected through a resampling of the data. We show how this is achieved in our
Resampling of signals recorded with Android sensors
notebook.
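As a minimal illustration of what such a resampling step does, hypothetical non-equidistant samples can be interpolated onto an equidistant 100 Hz grid with plain numpy (a simplified sketch, not the biosignalsnotebooks implementation):

```python
from numpy import arange, array, interp

# hypothetical non-equidistant time axis (in seconds) and corresponding sensor values
time_s = array([0.0, 0.011, 0.019, 0.031, 0.042, 0.050])
values = array([0.0, 1.1, 1.9, 3.1, 4.2, 5.0])

# build an equidistant time axis at the target rate of 100 Hz
fs = 100
uniform_time_s = arange(int(time_s[-1] * fs)) / fs

# linearly interpolate the original samples onto the uniform grid
uniform_values = interp(uniform_time_s, time_s, values)
print(uniform_values)  # approximately [0. 1. 2. 3. 4.]
```

Linear interpolation is the simplest choice; depending on the signal, more elaborate schemes (e.g. cubic interpolation) may be preferable.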
3 - The phone's coordinate system
The Android system uses a standard 3-axis coordinate system to define the phone's position. Most sensors use the coordinate system as shown in the image on the right. Assuming the phone is held in the hand, parallel to the face and with the screen facing towards the face, the directions of the three axes are defined as follows:
- The x-axis points horizontally to the right.
- The y-axis points vertically upwards.
- The z-axis points out of the screen, towards the viewer.
4 - Android sensor reporting modes
Android defines four types of reporting modes for their sensors. These are:
- Continuous: events are generated at a (nominally) constant rate.
- On change: events are generated only when the measured value changes.
- One shot: upon detection of the event of interest, the sensor reports a single event and deactivates itself.
- Special trigger: an event is generated upon each occurrence of a specific condition (e.g. each step taken).

Depending on their reporting mode, the sensors express different sampling behaviours. Most Android sensors are Continuous; we will describe the reporting mode of each sensor in a later section.
5 - Loading the data, getting useful information and plotting the sensor timeline
Let us have a look at some data acquired from Android sensors using the OpenSignals mobile application. The data shown in this and the following sections was acquired using a Xiaomi Mi A1 while taking a walk outside. The sampling rate was set to 100 Hz.
5.1 - Loading the data and getting data report
The facts stated in the previous sections can easily be confirmed by having a look at the data. Thus, before exploring the details of each sensor, let us first load the data using the load_android_data(...) function.
# set file path
path = '../../images/other/intro_to_android_sensors/'
# get a list with all the files within that folder
file_list = os.listdir(path)
# make full path for each file
file_list = [path + file for file in file_list if ".txt" in file]
# load the sensor data and print a data report
sensor_data, report = bsnb.load_android_data(file_list, print_report=True)
5.2 - Plotting the sensor acquisition timeline
In case we want to have visual insight into when each sensor acquired its samples, we can use the plot_android_sensor_timeline(...) function. As can be seen in the call below, it takes the sensor data list, the data report, the number of seconds to plot (plot_until_seconds) and the line thickness (line_thickness) as inputs.
For our purpose, we will only plot the first 10 seconds of the data. The line thickness is set to 1.5 in order to properly distinguish most of the lines from each other.
Looking at the plot generated by the function, we can confirm that the system does not sample at a fixed rate. It is also visible, once again, that the sensors start recording at different times. Additionally, the plot already gives us a hint about the reporting mode of each sensor. The Accelerometer, for example, is most likely a Continuous sensor, while the Light sensor is probably an On Change sensor.
bsnb.plot_android_sensor_timeline(sensor_data, report, plot_until_seconds=10, line_thickness=1.5)
6 - A closer look at the individual sensors
Android divides its sensors into three major categories; we added a fourth one in which we currently place the GPS sensor. These are:
- Motion sensors
- Position sensors
- Environment sensors
- Special sensors

In the following sections we will explore the sensors within each category and show what kind of data each sensor records. For each presented sensor, a plot of the data will be shown where appropriate. Since we do not want to overcrowd the plots with data, only a single channel of the data (in some cases two channels) will be plotted. For sensors where plotting the data does not make too much sense, the ".txt" file is presented instead.
To have the same time axis as in the sensor acquisition timeline plotted above, we will shift the time axes of each sensor according to the timestamp of the sensor that first started recording and convert the time from nanoseconds to seconds.
# get the earliest timestamp of the entire recording
start_time = min(report['starting times'])
# make a copy of the sensor_data list, copying each array so that the original data is not modified
shifted_sensor_data = [data.copy() for data in sensor_data]
# cycle through the sensor data list
for data in shifted_sensor_data:
    # check the dimensionality of the data (the significant motion sensor, for example, is one-dimensional
    # and holds only the event timestamps)
    if data.ndim == 1:
        # the entire array is the time axis
        time_axis = data[:]
    else:  # multidimensional array
        # the time axis is the first column
        time_axis = data[:, 0]
    # shift the time axis to start at zero and convert from nanoseconds to seconds
    time_axis = time_axis - start_time
    time_axis = time_axis * 1e-9
    # override the time axis in the data array
    if data.ndim == 1:
        data[:] = time_axis
    else:
        data[:, 0] = time_axis
6.1 - Motion Sensors
Motion sensors can be used to track the movement of the device. These movements may include tilt, shake, rotation or swing. Depending on which sensor is used, either the motion relative to the device's coordinate system or the motion relative to the world's coordinate system is measured. The sensors that are part of this category are:
Accelerometer (Continuous):
The accelerometer measures the acceleration force in m/s$^2$, including the force of gravity, that is applied to a device on all three physical axes (x, y, and z).
Uncalibrated Accelerometer (Continuous):
The uncalibrated accelerometer measures the acceleration in m/s$^2$ along all three physical axes (x, y, and z) without bias compensation (factory bias and temperature compensation are applied to the uncalibrated measurements). Additionally, it provides a bias estimation. This means that the uncalibrated accelerometer has a total of six data channels.
Linear Accelerometer (Continuous):
The linear accelerometer measures the acceleration in m/s$^2$, excluding gravity, along all three physical axes (x, y, and z).
Gravity (Continuous):
The gravity sensor measures the force of gravity in m/s$^2$ that is applied to a device along all three physical axes (x, y, and z).
Gyroscope (Continuous):
The gyroscope measures the device's rate of rotation in rad/s around each of the three physical axes (x, y, and z).
Uncalibrated Gyroscope (Continuous):
The uncalibrated gyroscope measures the device's rate of rotation in rad/s around each of the three physical axes (x, y, and z) without any drift compensation. Additionally, it provides the estimated drift for each axis. Thus, the uncalibrated gyroscope has a total of six data channels.
Rotation Vector (Continuous):
The rotation vector is a so-called
attitude composite
sensor. This means that it is based on a composition of other sensors; it is derived from the accelerometer, magnetic field and gyroscope sensors. It defines the device's orientation relative to an East-North-Up coordinate frame. In this frame the x-axis points east and is parallel to the ground, the y-axis is parallel to the ground as well and points north, and the z-axis is perpendicular to the ground, pointing upwards to the sky.
The rotation of the phone is relative to this system and can be seen as rotating the phone by an angle $\theta$ around a rotation axis. The Android system provides the coordinates of this rotation as the four unit-less components (x, y, z, and w) of a unit quaternion, where w is the scalar component of that quaternion.
The components can be described in terms of the rotation axis and the angle $\theta$.
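Concretely, assuming the phone is rotated by an angle $\theta$ around a unit rotation axis whose components we denote $(x_r, y_r, z_r)$ (our notation), the four reported quaternion components take the standard form:

$x = x_r \cdot \sin(\theta/2)$

$y = y_r \cdot \sin(\theta/2)$

$z = z_r \cdot \sin(\theta/2)$

$w = \cos(\theta/2)$

Since the quaternion has unit norm, $x^2 + y^2 + z^2 + w^2 = 1$ holds for every reported sample.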
In case you have never heard of quaternions before, we recommend watching this introductory
video
.
Significant Motion (One Shot):
Due to its One Shot reporting mode, the significant motion sensor triggers an event each time a significant motion is detected; after triggering the event it disables itself. A significant motion is a motion that might lead to a change in the user's location, for example walking, biking, or sitting in a moving car.
In the data we recorded a significant motion was detected only once. This motion was registered when the person recording the data started walking.
Step Counter (On Change):
The step counter counts the number of steps taken by the user while the sensor was activated. The step counter is only reset to zero when a system reboot is performed. It usually has higher latency (up to 10 seconds) but higher accuracy than the step detector sensor.
In case you want to count the number of steps since the start of the acquisition, then just subtract the first value of the data array from all values present in that array, as shown below.
# get the data of the step counter
step_counter = sensor_data[-2] # in our case the step counter is the penultimate entry in the sensor_data list
# shift values to start at zero steps
steps_since_recording_start = step_counter[:,1] - step_counter[0, 1]
Step Detector (Special Trigger):
Similar to the step counter, the step detector sensor triggers an event each time the user takes a step. However, instead of reporting the number of steps taken, the step detector just reports a 1.0 for each triggered event. The latency is expected to be below 2 seconds.
If you want to have a similar way of displaying as the step counter, you can simply calculate the cumulative sum over the data, as presented below.
# get the data of step detector
step_detector = sensor_data[-1] # in our case the step detector is the last value in the sensor_data list
# get the values of the detector
trigger_vals = step_detector[:,1]
# calculate the cumulative sum (cast to int because otherwise the values are floats)
steps = cumsum(trigger_vals, dtype=int)
# print the first twenty steps
print('steps: {}'.format(steps[:20]))
6.2 - Position Sensors
Position sensors can be used to track the phone's position relative to the world's coordinate system.
Game Rotation Vector (Continuous):
The game rotation vector is similar to the rotation vector presented above, the difference being that this sensor is derived only from an accelerometer and a gyroscope. Thus, the y-axis does not point north, but to some other reference instead. It has the same data fields as the rotation vector, and its data is unitless.
Geomagnetic Rotation Vector (Continuous):
As the name suggests, this sensor also has similarities to the rotation vector. The difference here is that the geomagnetic rotation vector is derived from an accelerometer and a magnetic field sensor. It has the same data fields as the rotation vector. The data of the sensor is unitless.
Magnetic Field (Continuous):
The magnetic field sensor measures the geomagnetic field strength in $\mu$T along all three physical axes (x, y, and z).
Uncalibrated Magnetic Field (Continuous):
The uncalibrated magnetic field sensor measures the geomagnetic field strength in $\mu$T along all three physical axes (x, y, and z). Additionally, it reports an estimation of the hard iron bias along each axis. This means that, like the other uncalibrated sensors, it has a total of six data channels.
Proximity (On Change):
The proximity sensor reports the distance from the sensor to the closest visible surface in cm. Depending on the phone model you are using, the sensor either reports a continuous spectrum of distance values or a two-value spectrum consisting of a value for close and a value for far. The phone that we used only reports two values, as can be seen in the ".txt" file of the sensor.
Since the proximity sensor is an On Change sensor, it will only trigger an event when there is a significant change in distance. This means that while the phone is situated in a pocket, lying on a surface or not moving, the sensor will not report any new values. This behaviour can also be seen in the ".txt" file of the sensor: it only registered a change in distance three times.
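Because an On Change sensor only reports values at change events, comparing it against a continuously sampled sensor usually requires holding the last reported value between events (a zero-order hold). A minimal sketch with hypothetical proximity data (times in seconds, distances in cm; not taken from the recording above):

```python
from numpy import array, searchsorted

# hypothetical on-change proximity events: event times and reported distances
event_times = array([0.0, 4.2, 7.8])
event_values = array([5.0, 0.0, 5.0])  # far, close, far

# uniform query times at which we want to know the sensor state
query_times = array([1.0, 5.0, 9.0])

# for each query time, take the value of the most recent event (zero-order hold)
indices = searchsorted(event_times, query_times, side='right') - 1
held_values = event_values[indices]
print(held_values)  # → [5. 0. 5.]
```

This assumes every query time lies at or after the first event; query times before the first event would need a separate default value.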
6.3 - Environment Sensors
Environment sensors provide insight on the environment in which the user (or the phone) is located.
Ambient Temperature (On Change):
The ambient temperature sensor measures, as the name suggests, the ambient temperature of the location in which the user (or phone) is situated. Unfortunately, we cannot provide any data on this sensor, since the phone used does not support it. However, due to its declaration as a sensor with an On Change reporting mode, we can deduce that it most likely behaves similarly to the proximity sensor or any other On Change sensor.
Light (On Change):
The light sensor measures the illuminance of the environment in lux. Due to its On Change reporting mode, the sensor is only triggered when there is a significant change in lighting (i.e. when the user pulls the phone out of a pocket, moves from a brightly lit environment into a darker one, etc.).
Pressure (Continuous):
The pressure sensor reports the atmospheric pressure in hPa (hectopascal). Unfortunately, we cannot provide any data on this sensor, since the phone used does not support it. However, due to its declaration as a sensor with a Continuous reporting mode, we can deduce that it most likely behaves similarly to the accelerometer or any other Continuous sensor.
Relative Humidity (On Change):
The relative humidity sensor measures the relative ambient humidity and returns its value as a percentage. Unfortunately, as for the pressure sensor, we cannot provide any data on this sensor, since the phone used does not support it. However, due to its declaration as a sensor with an On Change reporting mode, we can deduce that it most likely behaves similarly to the proximity sensor or any other On Change sensor.
6.4 - Special sensors
In this category we place the GPS sensor. In future versions of the OpenSignals mobile application we also plan to include the audio and camera/video sensors of an Android smartphone; these will then also be placed in this category. Android does not provide reporting mode types for these sensors; however, it can be assumed that they are Continuous, with their own system-dependent sampling rates.
GPS (Continuous):
The GPS sensor provides information on the position of the user (or phone). This position information is given as latitude and longitude. Some phone models also provide the altitude at which the user (or phone) currently is; if your phone does not support altitude measurements, the third channel will be zero. The ".txt" file of the GPS sensor is shown below.
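Since our recording was taken during a walk, the latitude/longitude channels can, for instance, be used to estimate the distance covered between two GPS fixes. A minimal haversine sketch with hypothetical coordinates (the earth-radius constant is the usual mean value of 6371 km; the function name is ours, not part of biosignalsnotebooks):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (latitude, longitude) pairs given in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    # haversine formula
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# two hypothetical consecutive GPS fixes, 0.0001 degrees apart on both axes
print(round(haversine_m(38.7500, -9.1500, 38.7501, -9.1501), 1))  # → 14.1
```

Summing such pairwise distances over all consecutive fixes gives a rough estimate of the total distance walked; GPS noise makes this an overestimate for slow movements.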
In this Jupyter Notebook we learned the basics of internal Android sensor data acquired with the OpenSignals mobile application. We discussed the limitations of the Android operating system concerning data acquisition and had a detailed look at each sensor supported by Android.
We hope that you have enjoyed this guide.
biosignalsnotebooks
is an environment in continuous expansion, so don't stop your journey and learn more with the remaining
Notebooks
.